Thanks to the machinations of the right, there is no dirtier word in American politics today than "liberal." Yet public-opinion polls consistently show that a majority of Americans hold liberal views on everything from health care to foreign policy.